Infinite Neural Networks!


TL;DR -- Recent research shows that 'wide' neural networks change very little when they are trained, while 'narrow' networks change their synaptic weights dramatically. This is a consequence of the fact that very wide networks all behave, statistically, like the same network at initialization. Because those initializations are statistically interchangeable, the 'distance' a network must travel to minimize its loss shrinks, so less change occurs during training. Though this makes wide networks train faster and with greater expressiveness, it is counter-productive for generalization, for following symbolic constraints and analogies, and for concatenating sub-tasks into goals. For those problems we'll need a Mixture of Experts.
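The "wide networks barely move" claim can be checked numerically. Below is a minimal sketch (my own illustration, not from the research the article summarizes): a one-hidden-layer ReLU network in the NTK-style 1/sqrt(width) parameterization, trained by full-batch gradient descent on a toy regression task. It reports how far the weights travel from initialization, relative to the size of the initialization; the hypothetical helper name `train_and_measure` and all hyperparameters are assumptions chosen for the demo.

```python
import numpy as np

def train_and_measure(width, steps=200, lr=0.5, seed=0):
    """Train a 1-hidden-layer ReLU net and return the relative
    distance its weights travel from initialization. (Demo helper,
    not part of any published codebase.)"""
    rng = np.random.default_rng(seed)
    # Toy 1-D regression data.
    x = np.linspace(-1.0, 1.0, 32).reshape(-1, 1)
    y = np.sin(3 * x)
    # NTK parameterization: weights ~ N(0, 1), output scaled by 1/sqrt(width).
    W1 = rng.standard_normal((1, width))
    W2 = rng.standard_normal((width, 1))
    W1_0, W2_0 = W1.copy(), W2.copy()
    for _ in range(steps):
        h = np.maximum(x @ W1, 0.0)            # hidden ReLU activations
        pred = h @ W2 / np.sqrt(width)         # network output
        err = pred - y                         # residual
        # Gradients of (1/2N) * sum of squared errors.
        gW2 = h.T @ err / (len(x) * np.sqrt(width))
        gh = (err @ W2.T / np.sqrt(width)) * (h > 0)
        gW1 = x.T @ gh / len(x)
        W1 -= lr * gW1
        W2 -= lr * gW2
    moved = np.linalg.norm(W1 - W1_0) + np.linalg.norm(W2 - W2_0)
    start = np.linalg.norm(W1_0) + np.linalg.norm(W2_0)
    return moved / start

narrow = train_and_measure(width=8)
wide = train_and_measure(width=4096)
print(f"relative weight change -- narrow: {narrow:.3f}, wide: {wide:.4f}")
```

Under this parameterization the relative movement shrinks roughly like 1/sqrt(width): the wide net fits the data while staying close to its (statistically generic) starting point, which is exactly the 'lazy training' behavior the TL;DR describes.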